Self-dual smoothing of convex and saddle functions

Author

  • Rafal Goebel
Abstract

It is shown that any convex function can be approximated by a family of differentiable, strongly convex approximations with Lipschitz continuous gradients, in a "self-dual" way: the conjugate of each approximation is the approximation of the conjugate of the original function. The approximation technique extends to saddle functions, and is self-dual with respect to saddle-function conjugacy as well as the partial conjugacy that relates saddle functions to convex functions.

Mathematics Subject Classification: 52A41, 90C25, 90C59, 90C46, 26B25
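For context, a standard (but not self-dual) smoothing with the properties the abstract mentions is the Moreau envelope e_λf(x) = min_y { f(y) + |x − y|²/(2λ) }, which is differentiable with a Lipschitz continuous gradient; its conjugate, however, is f* + (λ/2)|·|² rather than the Moreau envelope of f*, which is what a self-dual construction must repair. The sketch below, not taken from the paper, numerically computes the Moreau envelope of f(y) = |y| on a grid and checks it against its known closed form, the Huber function; the function names and grid are illustrative choices.

```python
import numpy as np

def moreau_envelope(f, x, lam, grid):
    # e_lam f(x) = min_y { f(y) + (x - y)^2 / (2*lam) }, minimized over a grid
    vals = f(grid) + (x - grid) ** 2 / (2 * lam)
    return vals.min()

def huber(x, lam):
    # Closed-form Moreau envelope of f(y) = |y|: the Huber function
    return x ** 2 / (2 * lam) if abs(x) <= lam else abs(x) - lam / 2

grid = np.linspace(-5, 5, 200001)   # fine grid covering the minimizers
lam = 0.5
for x in [-2.0, -0.3, 0.0, 0.4, 1.5]:
    approx = moreau_envelope(np.abs, x, lam, grid)
    assert abs(approx - huber(x, lam)) < 1e-4
```

The grid-based minimization is only an illustration; in practice the envelope of |·| is evaluated through its closed form, and smoothness can be seen from the Huber function being C¹ at |x| = λ.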


Similar articles

Accelerated gradient sliding for structured convex optimization

Our main goal in this paper is to show that one can skip gradient computations for gradient descent type methods applied to certain structured convex programming (CP) problems. To this end, we first present an accelerated gradient sliding (AGS) method for minimizing the summation of two smooth convex functions with different Lipschitz constants. We show that the AGS method can skip the gradient...

Stochastic Primal-Dual Coordinate Method for Regularized Empirical Risk Minimization

We consider a generic convex optimization problem associated with regularized empirical risk minimization of linear predictors. The problem structure allows us to reformulate it as a convex-concave saddle point problem. We propose a stochastic primal-dual coordinate method, which alternates between maximizing over one (or more) randomly chosen dual variable and minimizing over the primal variab...

Generalized Second Derivatives of Convex Functions and Saddle Functions (accepted for publication in the Transactions of the American Mathematical Society, 1990)

The theory of second-order epi-derivatives of extended-real-valued functions is applied to convex functions on ℝⁿ and shown to be closely tied to proto-differentiation of the corresponding subgradient multifunctions, as well as to second-order epi-differentiation of conjugate functions. An extension is then made to saddle functions, which by definition are convex in one argument and concave in ...

Stochastic Parallel Block Coordinate Descent for Large-Scale Saddle Point Problems

We consider convex-concave saddle point problems with a separable structure and non-strongly convex functions. We propose an efficient stochastic block coordinate descent method using adaptive primal-dual updates, which enables flexible parallel optimization for large-scale problems. Our method shares the efficiency and flexibility of block coordinate descent methods with the simplicity of prim...

Optimality and Duality for an Efficient Solution of Multiobjective Nonlinear Fractional Programming Problem Involving Semilocally Convex Functions

In this paper, the problem under consideration is a multiobjective nonlinear fractional programming problem involving semilocally convex and related functions. We discuss the interrelation between the solution sets involving properly efficient solutions of multiobjective fractional programming and the corresponding scalar fractional programming problem. Necessary and sufficient optimality...


Publication year: 2007